Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in learning
Abstract
Randomized neural networks are immortalized in this AI Koan: In the days when Sussman was a novice, Minsky once came to him as he sat hacking at the PDP-6. “What are you doing?” asked Minsky. “I am training a randomly wired neural net to play tic-tac-toe,” Sussman replied. “Why is the net wired randomly?” asked Minsky. Sussman replied, “I do not want it to have any preconceptions of how to play.” Minsky then shut his eyes. “Why do you close your eyes?” Sussman asked his teacher. “So that the room will be empty,” replied Minsky. At that moment, Sussman was enlightened. We analyze shallow random networks with the help of concentration of measure inequalities. Specifically, we consider architectures that compute a weighted sum of their inputs after passing them through a bank of arbitrary randomized nonlinearities. We identify conditions under which these networks exhibit good classification performance, and bound their test error in terms of the size of the dataset and the number of random nonlinearities.
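The abstract describes the architecture only in words, so here is a minimal sketch of the idea in Python: the input weights of a bank of nonlinearities are drawn at random and never trained, and only the output weights are fit by a simple convex problem. The cosine feature map, the ridge-regression solver, and every hyperparameter value below are illustrative assumptions, not the paper's prescribed setup.

```python
# Random-kitchen-sinks-style sketch: random nonlinear features + fitted output weights.
import numpy as np

def random_features(X, W, b):
    """Pass inputs through a bank of randomized nonlinearities (random cosines here)."""
    return np.cos(X @ W + b)                      # shape: (n_samples, K)

def fit_kitchen_sinks(X, y, K=500, scale=1.0, reg=1e-3, seed=None):
    """Sample K random nonlinearities, then solve only for the output weights."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=scale, size=(d, K))      # random input weights (never trained)
    b = rng.uniform(0.0, 2 * np.pi, size=K)       # random phases (never trained)
    Z = random_features(X, W, b)
    # Ridge regression on the random features: the only minimization step.
    alpha = np.linalg.solve(Z.T @ Z + reg * np.eye(K), Z.T @ y)
    return W, b, alpha

def predict(X, W, b, alpha):
    """Weighted sum of the random nonlinearities."""
    return random_features(X, W, b) @ alpha

# Toy usage: regress a nonlinear function of 1-D inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=200)
W, b, alpha = fit_kitchen_sinks(X, y, K=300, seed=0)
print(np.mean((predict(X, W, b, alpha) - y) ** 2))  # training MSE
```

Note that only `alpha` is obtained by minimization; `W` and `b` stay at their random draws, which is what "replacing minimization with randomization" in the title refers to.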
Similar sources
Asymptotic Behavior of Weighted Sums of Weakly Negative Dependent Random Variables
Let be a sequence of weakly negative dependent (WND) random variables with common distribution function F, and let be another sequence of positive random variables independent of and for some and for all . In this paper, we study the asymptotic behavior of the tail probabilities of the maximum, weighted sums, randomly weighted sums and randomly indexed weighted sums of heavy...
Strong Laws for Weighted Sums of Negative Dependent Random Variables
In this paper, we discuss strong laws for weighted sums of pairwise negatively dependent random variables. The results for the i.i.d. case of Soo Hak Sung [9] are generalized and extended.
Strong Convergence of Weighted Sums for Negatively Orthant Dependent Random Variables
We discuss in this paper the strong convergence for weighted sums of negatively orthant dependent (NOD) random variables by generalized Gaussian techniques. As a corollary, a Cesàro law of large numbers for i.i.d. random variables is extended to the NOD setting by generalized Gaussian techniques.
Publication date: 2008